Kullback–Leibler divergence and the Pareto–Exponential approximation
Abstract
Recent radar research interest in the Pareto distribution as a model for X-band maritime surveillance radar clutter returns has resulted in analysis of the asymptotic behaviour of this clutter model. In particular, it is of interest to understand when the Pareto distribution is well approximated by an Exponential distribution. The justification for this is that, under the latter clutter model assumption, simpler radar detection schemes can be applied. An information-theoretic approach is introduced to investigate the Pareto–Exponential approximation. By analysing the Kullback–Leibler divergence between the two distributions, it is possible not only to assess when the approximation is valid, but also to determine, for a given Pareto model, the optimal Exponential approximation.
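The abstract does not reproduce the derivation, but the idea can be illustrated numerically. Below is a minimal sketch, assuming a Pareto Type II (Lomax) clutter density with shape α and scale β (a common parameterisation in the maritime clutter literature; the paper's exact form may differ), which evaluates D(f‖g) against an Exponential density g by quadrature and searches for the KL-optimal rate λ. The values of α and β are illustrative, not taken from the paper.

```python
# Minimal sketch (assumed Lomax parameterisation, illustrative alpha/beta):
# evaluate D(f || g) between the Pareto Type II clutter density
#   f(x) = alpha * beta^alpha / (x + beta)^(alpha + 1),  x >= 0,
# and an Exponential density g(x) = lam * exp(-lam * x),
# then search for the KL-optimal Exponential rate lam.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar

def kl_pareto_exponential(alpha: float, beta: float, lam: float) -> float:
    """Kullback-Leibler divergence D(f || g), computed by quadrature."""
    def integrand(x):
        # work with log-densities so g's underflow at large x is avoided
        log_f = np.log(alpha) + alpha * np.log(beta) - (alpha + 1) * np.log(x + beta)
        log_g = np.log(lam) - lam * x
        return np.exp(log_f) * (log_f - log_g)
    value, _ = quad(integrand, 0.0, np.inf)
    return value

alpha, beta = 10.0, 2.0  # illustrative shape and scale, not from the paper
result = minimize_scalar(
    lambda lam: kl_pareto_exponential(alpha, beta, lam),
    bounds=(1e-3, 50.0), method="bounded",
)
print(f"KL-optimal rate lam = {result.x:.4f}, divergence = {result.fun:.6f}")
```

Under this parameterisation the minimisation in λ actually has a closed form: writing D(f‖g) = −h(f) − log λ + λ E_f[X] and differentiating gives λ* = 1/E_f[X] = (α − 1)/β for α > 1, a moment match to the Pareto mean. The numeric search above should therefore return λ ≈ 4.5 for the illustrative values.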
Similar resources
Statistics of Extremes by Oracle Estimation
We use the fitted Pareto law to construct an accompanying approximation of the excess distribution function. A selection rule for the location of the excess distribution function is proposed, based on a stagewise lack-of-fit testing procedure. Our main result is an oracle-type inequality for the Kullback–Leibler loss.
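As a hedged illustration of the peaks-over-threshold idea this abstract builds on (not the authors' oracle or stagewise procedure), one can fit a generalized Pareto law to the excesses over a threshold and use it as an approximation of the excess distribution function; the data, threshold choice, and use of scipy's genpareto here are all illustrative assumptions.

```python
# Minimal sketch (not the paper's oracle/stagewise procedure): fit a
# generalized Pareto law to threshold excesses and use it to approximate
# the excess distribution function P(X - u <= y | X > u).
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
sample = rng.pareto(3.0, size=5000) + 1.0  # heavy-tailed toy data
u = np.quantile(sample, 0.95)              # illustrative threshold choice
excesses = sample[sample > u] - u

# maximum-likelihood fit of the generalized Pareto, location fixed at 0
shape, loc, scale = genpareto.fit(excesses, floc=0.0)
print(f"fitted GPD shape = {shape:.3f}, scale = {scale:.3f}")

# evaluate the fitted approximation of the excess distribution function
y = np.array([0.5, 1.0, 2.0])
print("P(X - u <= y | X > u) ≈", genpareto.cdf(y, shape, loc=0.0, scale=scale))
```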
Model Confidence Set Based on Kullback–Leibler Divergence Distance
Consider the problem of estimating the true density h(.) based upon a random sample X1, …, Xn. In general, h(.) is approximated using an appropriate (in some sense; see below) model fθ(x). This article, using Vuong's (1989) test along with a collection of k (> 2) non-nested models, constructs a set of appropriate models, a so-called model confidence set, for the unknown model h(.). Application of such confide...
Use of Kullback–Leibler divergence for forgetting
Non-symmetric Kullback–Leibler divergence (KLD) measures the proximity of probability density functions (pdfs). Bernardo (Ann. Stat. 1979; 7(3):686–690) showed its unique role in the approximation of pdfs. The order of the KLD arguments is also implied by his methodological result. Functional approximation of estimation and stabilized forgetting, serving for tracking of slowly varying parameters, us...
Fixed-Form Variational Posterior Approximation through Stochastic Linear Regression
We propose a general algorithm for approximating nonstandard Bayesian posterior distributions. The algorithm minimizes the Kullback–Leibler divergence from an approximating distribution to the intractable posterior distribution. Our method can be used to approximate any posterior distribution, provided that it is given in closed form up to a proportionality constant. The approximation can be an...
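A heavily hedged sketch of the general setting (generic fixed-form KL minimisation, not the paper's stochastic-linear-regression algorithm): fit a Gaussian q(x; m, s) to an unnormalised one-dimensional target by minimising E_q[log q − log p̃], which equals KL(q‖p) up to the unknown log normalising constant. The target density and optimiser below are illustrative assumptions.

```python
# Minimal sketch (generic fixed-form KL minimisation, NOT the paper's
# stochastic linear regression algorithm): fit a Gaussian q(x; m, s) to an
# unnormalised 1-D target p_tilde by minimising E_q[log q - log p_tilde],
# which equals KL(q || p) up to the unknown log normalising constant.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize

def log_p_tilde(x):
    # illustrative unnormalised, non-Gaussian target (skewed by a tanh term)
    return -0.5 * x**2 + np.log1p(1.0 + 0.9 * np.tanh(2.0 * x))

def kl_up_to_const(params):
    m, log_s = params
    s = np.exp(log_s)  # optimise log s to keep the scale positive
    def integrand(x):
        log_q = -0.5 * ((x - m) / s) ** 2 - np.log(s) - 0.5 * np.log(2 * np.pi)
        return np.exp(log_q) * (log_q - log_p_tilde(x))
    value, _ = quad(integrand, m - 10 * s, m + 10 * s)
    return value

result = minimize(kl_up_to_const, x0=np.array([0.0, 0.0]), method="Nelder-Mead")
m_opt, s_opt = result.x[0], np.exp(result.x[1])
print(f"fixed-form Gaussian approximation: mean = {m_opt:.3f}, sd = {s_opt:.3f}")
```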
Min-Max Kullback–Leibler Model Selection
This paper considers an information-theoretic min-max approach to the model selection problem. The aim of this approach is to select the member of a given parameterized family of probability models so as to minimize the worst-case Kullback–Leibler divergence from an uncertain “truth” model. Uncertainty of the truth is specified by an upper bound on the KL divergence relative to a given reference...